πŸ•ΈοΈ Ada Research Browser

IMPLEMENTATION_PLAN.md

Implementation Plan: Ideal Report Structure

Version: 1.0
Created: 2026-03-08
Branch: feature/ideal-report-structure
Reference: docs/IDEAL_REPORT_STRUCTURE.md


Implementation Phases

Phase 1: Database Schema (Foundation)

Priority: CRITICAL
Dependencies: None
Estimated Effort: 2-3 hours

Tasks:

1. Create new schema file: sql/05-scan-registry-schema.sql
   - blueteam.scans table
   - blueteam.attack_catalog table
   - blueteam.attack_compliance table
   - blueteam.scan_attacks table
   - blueteam.findings table
   - blueteam.mitigation_findings (linking table)
   - blueteam.scan_comparisons table
   - All indexes and foreign keys
2. Create migration script: sql/migrate-to-scan-registry.sql
   - Preserve existing mitigation_projects data
   - Create backward compatibility views
   - Map existing data to new schema
3. Apply schema to Alfred database
   - Test on development database first
   - Validate constraints and indexes
   - Verify migration preserves data
4. Update database permissions
   - Grant alfred_admin access to new tables
   - Update blueteam schema grants

Deliverables:
- [x] sql/05-scan-registry-schema.sql
- [ ] sql/migrate-to-scan-registry.sql
- [ ] Schema applied to Alfred database
- [ ] Migration tested and verified
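As a rough sketch of what the scan-registry DDL could look like, the two central tables and their relationship are shown below. Table names come from the plan; all column names and types are illustrative assumptions, not the final schema.

```python
# Illustrative sketch of the planned scan-registry DDL. The table names are
# from the plan; every column shown here is an assumption for discussion.
CORE_TABLES = {
    "blueteam.scans": """
        CREATE TABLE blueteam.scans (
            scan_id         UUID PRIMARY KEY,
            target_name     TEXT NOT NULL,
            environment     TEXT NOT NULL,      -- production/staging/dev
            started_at      TIMESTAMPTZ NOT NULL,
            finished_at     TIMESTAMPTZ,
            config_snapshot JSONB               -- scan config at run time
        );""",
    "blueteam.findings": """
        CREATE TABLE blueteam.findings (
            finding_id  BIGSERIAL PRIMARY KEY,
            scan_id     UUID REFERENCES blueteam.scans(scan_id),
            attack_id   INTEGER,                -- FK to attack_catalog
            severity    TEXT NOT NULL,
            detail      JSONB
        );""",
}

def schema_sql() -> str:
    """Concatenate the per-table DDL in dependency order."""
    return "\n".join(CORE_TABLES[name] for name in CORE_TABLES)
```

The foreign key from findings to scans is what enables per-scan history queries and the later comparison work.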


Phase 2: Attack Catalog Population

Priority: HIGH
Dependencies: Phase 1
Estimated Effort: 2-3 hours

Tasks:

1. Create catalog builder script: scripts/build-attack-catalog.py
   - Scan all attack modules in redteam/attacks/
   - Extract: name, category, description, severity
   - Extract target_types from class attributes
   - Generate SQL INSERT statements
2. Extract compliance mappings
   - Parse docstrings for NIST/framework references
   - Parse comments for control IDs
   - Create attack_compliance entries
3. Populate catalog table
   - Run catalog builder
   - Insert into blueteam.attack_catalog
   - Insert into blueteam.attack_compliance
4. Create catalog update workflow
   - Add pre-commit hook to regenerate catalog
   - Document manual update process
   - Add validation tests

Deliverables:
- [ ] scripts/build-attack-catalog.py
- [ ] Attack catalog populated
- [ ] Compliance mappings populated
- [ ] Catalog update documentation
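The catalog builder's core logic could be sketched as below. The real attack classes live in redteam/attacks/; here a stand-in record simulates what module scanning would yield, and the attribute names (name, category, severity, target_types) are assumptions about the module interface.

```python
# Sketch of build-attack-catalog.py's core: turn extracted attack metadata
# into INSERT statements for blueteam.attack_catalog. The AttackInfo fields
# are assumptions about what the attack modules expose.
from dataclasses import dataclass

@dataclass
class AttackInfo:
    name: str
    category: str
    description: str
    severity: str
    target_types: tuple

def to_insert(a: AttackInfo) -> str:
    """Render one attack as an INSERT into blueteam.attack_catalog."""
    def q(s: str) -> str:           # minimal SQL string escaping
        return "'" + s.replace("'", "''") + "'"
    cols = "(name, category, description, severity, target_types)"
    vals = (f"({q(a.name)}, {q(a.category)}, {q(a.description)}, "
            f"{q(a.severity)}, {q(','.join(a.target_types))})")
    return f"INSERT INTO blueteam.attack_catalog {cols} VALUES {vals};"

# Stand-in for what scanning redteam/attacks/ would discover
attacks = [AttackInfo("sql_injection", "injection", "Classic SQLi probe",
                      "high", ("app", "wordpress"))]
statements = [to_insert(a) for a in attacks]
```

Emitting plain SQL keeps the builder pipeable into psql, which matches the planned deployment workflow.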


Phase 3: Enhanced Report Generation

Priority: HIGH
Dependencies: Phase 1, Phase 2
Estimated Effort: 3-4 hours

Tasks:

1. Update config.yaml structure
   - Add target.name field
   - Add target.environment field (production/staging/dev)
   - Add target.type field (app/wordpress/static/cloud)
   - Add execution.operator field
2. Create new reporter: redteam/reporters/hierarchical_json.py
   - Generate scan_metadata section
   - Generate hierarchical attacks array with nested variants
   - Include attack descriptions from catalog
   - Include compliance mappings
   - Add scan_config snapshot
   - Maintain backward compatibility with flat findings array
3. Update runner to use new reporter
   - Generate unique scan_id (UUID or timestamp-based)
   - Capture config snapshot
   - Pass target metadata to reporter
   - Track execution metadata (start/end times)
4. Update JsonReporter for backward compatibility
   - Keep existing format as fallback
   - Add deprecation warning
   - Plan removal in v2.0

Deliverables:
- [ ] Updated config.yaml structure
- [ ] redteam/reporters/hierarchical_json.py
- [ ] Updated redteam/runner.py
- [ ] Backward compatibility maintained
- [ ] Generated report validated
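The hierarchical reporter could take the shape below: flat findings are grouped by attack into a nested attacks array under scan_metadata, while the flat array is kept for backward compatibility. All field names here are assumptions based on the plan, not the final report schema.

```python
# Sketch of the hierarchical_json reporter: group flat findings by attack
# under a scan_metadata header. Field names are assumptions from the plan.
import uuid
from collections import defaultdict
from datetime import datetime, timezone

def build_report(target: dict, findings: list) -> dict:
    by_attack = defaultdict(list)
    for f in findings:
        by_attack[f["attack"]].append({"variant": f["variant"],
                                       "result": f["result"]})
    return {
        "scan_metadata": {
            "scan_id": str(uuid.uuid4()),       # unique per scan run
            "target": target,
            "generated_at": datetime.now(timezone.utc).isoformat(),
        },
        "attacks": [{"name": name, "variants": variants}
                    for name, variants in by_attack.items()],
        "findings": findings,   # flat array kept for backward compatibility
    }

report = build_report(
    {"name": "Artemis Production", "environment": "production"},
    [{"attack": "xss", "variant": "reflected", "result": "vulnerable"},
     {"attack": "xss", "variant": "stored", "result": "defended"}],
)
```

Keeping both shapes in one document lets the old JsonReporter consumers keep working while new tooling reads the nested form.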


Phase 4: Database Integration

Priority: HIGH
Dependencies: Phase 2, Phase 3
Estimated Effort: 2-3 hours

Tasks:

1. Create scan importer: scripts/import-scan-to-db.py
   - Parse hierarchical JSON report
   - Insert into blueteam.scans
   - Insert into blueteam.scan_attacks
   - Insert into blueteam.findings
   - Link to attack_catalog
   - Handle duplicate scans gracefully
2. Update mitigation import workflow
   - Create findings first
   - Link findings to mitigation_issues via mitigation_findings
   - Preserve existing mitigation workflow
   - Maintain backward compatibility
3. Create API endpoints for scan data
   - /api/scans/list - List all scans
   - /api/scans/{id} - Get scan details
   - /api/scans/{id}/findings - Get findings for scan
   - /api/scans/compare?baseline={id}&current={id} - Compare scans
4. Update dashboard to consume new APIs
   - Add scan selector dropdown
   - Show scan metadata (target, date, duration)
   - Display hierarchical attack results
   - Link findings to mitigation issues

Deliverables:
- [ ] scripts/import-scan-to-db.py
- [ ] Updated import workflow
- [ ] New API endpoints
- [ ] Dashboard integration
- [ ] Data integrity verified
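The importer's core, including the duplicate-scan guard, could be sketched as follows. No database is touched here; the row shapes are assumptions matching the planned tables, and deduplication is keyed on scan_id.

```python
# Sketch of import-scan-to-db.py's core: translate a hierarchical report
# into parameterized rows, skipping scans already imported. Row shapes are
# assumptions matching the planned blueteam tables.
def rows_for_report(report: dict, known_scan_ids: set) -> dict:
    meta = report["scan_metadata"]
    if meta["scan_id"] in known_scan_ids:
        return {"scans": [], "findings": []}   # duplicate scan: import nothing
    findings = []
    for attack in report["attacks"]:
        for v in attack["variants"]:
            findings.append((meta["scan_id"], attack["name"],
                             v["variant"], v["result"]))
    return {
        "scans": [(meta["scan_id"], meta["target"]["name"])],
        "findings": findings,
    }

report = {"scan_metadata": {"scan_id": "s1", "target": {"name": "Artemis"}},
          "attacks": [{"name": "xss",
                       "variants": [{"variant": "stored",
                                     "result": "vulnerable"}]}]}
fresh = rows_for_report(report, known_scan_ids=set())
dup = rows_for_report(report, known_scan_ids={"s1"})
```

Returning empty row sets for a known scan_id makes re-running the importer idempotent, which matters for the cron-driven workflow later in the plan.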


Phase 5: Scan Comparison & History

Priority: MEDIUM
Dependencies: Phase 4
Estimated Effort: 3-4 hours

Tasks:

1. Create comparison engine: blueteam/scan_comparison.py
   - Compare two scans (baseline vs. current)
   - Detect new vulnerabilities
   - Detect fixed vulnerabilities
   - Detect regressions (fixed β†’ vulnerable)
   - Detect improvements (vulnerable β†’ defended)
   - Calculate trend scores
2. Create comparison API endpoints
   - /api/scans/compare - Generate comparison
   - /api/scans/{id}/history - Get historical comparisons
   - /api/scans/trends - Aggregate trends over time
3. Add dashboard comparison view
   - Side-by-side scan comparison
   - Highlight new/fixed/regression findings
   - Trend graphs (vulnerability count over time)
   - Severity distribution changes
4. Create automated regression detection
   - Run comparison after each scan
   - Alert on regressions
   - Track improvement metrics

Deliverables:
- [ ] blueteam/scan_comparison.py
- [ ] Comparison API endpoints
- [ ] Dashboard comparison view
- [ ] Automated regression alerts
- [ ] Trend visualization
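The comparison engine's classification logic could look like this, treating each scan as a mapping from (attack, variant) to a result. The categories mirror the plan; the result vocabulary ("vulnerable"/"defended") is an assumption about the finding states.

```python
# Sketch of blueteam/scan_comparison.py's core: classify each finding as
# new, fixed, regression, or improvement between two scans. The result
# strings are assumptions about the finding-state vocabulary.
def compare(baseline: dict, current: dict) -> dict:
    out = {"new": [], "fixed": [], "regressions": [], "improvements": []}
    for key, result in current.items():
        prev = baseline.get(key)
        if prev is None and result == "vulnerable":
            out["new"].append(key)              # not seen in baseline
        elif prev == "defended" and result == "vulnerable":
            out["regressions"].append(key)      # fixed -> vulnerable
        elif prev == "vulnerable" and result == "defended":
            out["improvements"].append(key)     # vulnerable -> defended
    for key, result in baseline.items():
        if result == "vulnerable" and key not in current:
            out["fixed"].append(key)            # no longer reproducible
    return out

baseline = {("xss", "stored"): "vulnerable", ("sqli", "basic"): "defended"}
current  = {("xss", "stored"): "defended",  ("sqli", "basic"): "vulnerable",
            ("csrf", "form"): "vulnerable"}
diff = compare(baseline, current)
```

The same diff structure can feed both the dashboard highlights and the automated regression alerts.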


Phase 6: Testing & Validation

Priority: HIGH
Dependencies: All previous phases
Estimated Effort: 2-3 hours

Tasks:

1. Unit tests
   - Test hierarchical_json reporter
   - Test scan importer
   - Test comparison engine
   - Test API endpoints
2. Integration tests
   - Run full scan with new structure
   - Import to database
   - Query via API
   - Verify dashboard display
3. Migration testing
   - Test migration script on copy of production data
   - Verify no data loss
   - Validate foreign key integrity
   - Test rollback procedure
4. Performance testing
   - Benchmark database queries
   - Test with large scan datasets
   - Optimize slow queries
   - Add necessary indexes

Deliverables:
- [ ] Unit test suite
- [ ] Integration tests passing
- [ ] Migration validated
- [ ] Performance benchmarks
- [ ] Optimization complete
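A unit test for the comparison engine could take the pytest-style shape below. The compare() interface is an assumption (the real engine will live in blueteam/scan_comparison.py); a minimal stand-in is inlined so the test is self-contained.

```python
# Pytest-style unit test sketch for the planned comparison engine.
# compare() here is a minimal stand-in for the assumed interface of
# blueteam.scan_comparison.compare.
def compare(baseline: dict, current: dict) -> dict:
    """Stand-in: report only regressions (defended -> vulnerable)."""
    return {
        "regressions": [k for k, v in current.items()
                        if v == "vulnerable" and baseline.get(k) == "defended"],
    }

def test_detects_regression():
    baseline = {("sqli", "basic"): "defended"}
    current = {("sqli", "basic"): "vulnerable"}
    assert compare(baseline, current)["regressions"] == [("sqli", "basic")]

test_detects_regression()   # pytest would collect and run this automatically
```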


Phase 7: Documentation & Deployment

Priority: HIGH
Dependencies: All previous phases
Estimated Effort: 1-2 hours

Tasks:

1. Update README.md
   - Document new report structure
   - Update scan configuration guide
   - Add migration instructions
   - Document new API endpoints
2. Create operator guide
   - How to run scans with new structure
   - How to compare scans
   - How to interpret trends
   - Troubleshooting guide
3. Deploy to Alfred
   - Apply database migration
   - Deploy updated code
   - Run test scan
   - Verify dashboard
4. Deploy to cp.quigs.com (webhost)
   - Apply database migration
   - Deploy updated code
   - Import historical scans
   - Verify integration

Deliverables:
- [ ] Updated README.md
- [ ] Operator guide created
- [ ] Deployed to Alfred
- [ ] Deployed to webhost
- [ ] Historical data migrated


Risk Mitigation

Database Migration Risks

Risk: Data loss during migration

Mitigation:
- Full database backup before migration
- Test migration on copy first
- Rollback script prepared
- Validation queries to verify data integrity

Backward Compatibility Risks

Risk: Breaking existing workflows

Mitigation:
- Maintain old report format in parallel
- Create compatibility views in database
- Gradual deprecation timeline
- Clear migration documentation

Performance Risks

Risk: New structure slower than old

Mitigation:
- Benchmark before/after
- Optimize indexes
- Use materialized views for aggregations
- Cache frequently-accessed data


Success Criteria

  1. All scans generate hierarchical reports with complete metadata
  2. Database contains full scan history with no data loss
  3. Dashboard displays scans with filtering and comparison
  4. API returns scan data in <200ms for typical queries
  5. Comparison engine detects regressions and improvements
  6. Documentation is complete and tested by another operator
  7. All tests pass with >80% code coverage
  8. Production deployment successful with zero downtime

Timeline

Total Estimated Effort: 15-20 hours

Proposed Schedule:
- Day 1: Phase 1 (Database Schema) + Phase 2 (Attack Catalog)
- Day 2: Phase 3 (Report Generation) + Phase 4 (Database Integration)
- Day 3: Phase 5 (Comparison) + Phase 6 (Testing) + Phase 7 (Deployment)

Checkpoints:
- End of Day 1: Database schema complete, catalog populated
- End of Day 2: Reports generating, data importing to DB
- End of Day 3: Full deployment with historical data migrated


Rollback Plan

If critical issues arise:

1. Revert database migration

   ```bash
   # Run rollback script
   psql -U alfred_admin -d alfred_admin -f sql/rollback-scan-registry.sql
   ```

2. Revert code changes

   ```bash
   git checkout main
   git branch -D feature/ideal-report-structure
   ```

3. Restore backup

   ```bash
   pg_restore -U alfred_admin -d alfred_admin backup-YYYYMMDD.dump
   ```

4. Verify old workflow
   - Run scan with old format
   - Check existing mitigation dashboard
   - Validate reports directory

Artemis Server Migration Instructions

Artemis is Brandon's production server running WordPress and other services. The cyber-guardian scanner will be deployed to Artemis for production security monitoring.

Prerequisites

1. PostgreSQL Database Setup

   ```bash
   # Create database and user (if not exists)
   sudo -u postgres createdb artemis_security
   sudo -u postgres createuser artemis_admin
   sudo -u postgres psql -c "GRANT ALL PRIVILEGES ON DATABASE artemis_security TO artemis_admin;"
   ```

2. Create blueteam schema

   ```bash
   sudo -u postgres psql artemis_security -c "CREATE SCHEMA IF NOT EXISTS blueteam;"
   sudo -u postgres psql artemis_security -c "GRANT ALL ON SCHEMA blueteam TO artemis_admin;"
   ```

Migration Steps

Step 1: Deploy Code

```bash
# Clone or pull cyber-guardian repo
cd /opt/claude-workspace/projects
git clone https://github.com/Quig-Enterprises/cyber-guardian.git
cd cyber-guardian
git checkout feature/ideal-report-structure

# Install dependencies
pip install -r requirements.txt
```

Step 2: Apply Database Schema

```bash
# Apply scan registry schema
sudo -u postgres psql artemis_security -f sql/05-scan-registry-schema.sql

# Populate attack catalog
python3 scripts/build-attack-catalog.py | sudo -u postgres psql artemis_security
```

Step 3: Configure Scanner

```bash
# Create Artemis-specific config
cp config.yaml config_artemis.yaml

# Edit config_artemis.yaml:
# - Set target.base_url to the Artemis WordPress URL
# - Set target.name to "Artemis Production"
# - Set target.environment to "production"
# - Set target.type to "wordpress"
# - Set database connection to artemis_security
```

Step 4: Run Initial Scan

```bash
# Test scan (dry run)
python3 redteam/cli.py --config config_artemis.yaml --dry-run

# Full production scan
python3 redteam/cli.py --config config_artemis.yaml

# Import results to database
python3 scripts/import-scan-to-db.py \
  --latest \
  --database artemis_security \
  --user artemis_admin
```

Step 5: Set Up Automated Scanning

```bash
# Create cron job for daily scans
sudo crontab -e

# Add line (runs daily at 02:00):
0 2 * * * cd /opt/claude-workspace/projects/cyber-guardian && python3 redteam/cli.py --config config_artemis.yaml && python3 scripts/import-scan-to-db.py --latest --database artemis_security --user artemis_admin
```

Step 6: Deploy Dashboard (Optional)

```bash
# If deploying the security dashboard on Artemis:
# 1. Copy dashboard/ to a web-accessible directory
# 2. Update dashboard/api/*.php with artemis_security database credentials
# 3. Configure nginx/apache to serve the dashboard
# 4. Test at https://artemis.quigs.com/security-dashboard/
```

Artemis-Specific Notes

Validation Checklist


Notes